The Method of Steepest Descent for Feedforward Artificial Neural Networks

Authors

  • Muhammad Hanif
  • Md. Jashim Uddin
  • Md Abdul Alim
Abstract

In this paper, we implement the method of Steepest Descent in single-layer and multilayer feedforward artificial neural networks. In all previous works, the weight update equations for single-layer or multilayer feedforward artificial neural networks have been derived by choosing a single activation function for all processing units in the network. We first calculate the total error function separately for single-layer and multilayer feedforward artificial neural networks, and then derive three new weight update equations that allow a different activation function in each processing unit, again separately for the single-layer and multilayer cases. An example is given to show the usefulness of this implementation.
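As a hedged illustration of the general idea (not the authors' exact derivation or notation), the sketch below applies one steepest-descent update to a single-layer feedforward network in which each output unit may use a different activation function. The squared-error loss, the network shape, and the names ACTS, forward, and steepest_descent_step are all assumptions introduced here for illustration.

```python
# Minimal sketch, assuming a squared-error loss E = 0.5 * ||t - y||^2 and a
# single weight layer; each output unit may use a different activation.
import numpy as np

# Activation and its derivative, the latter written in terms of the output y.
ACTS = {
    "sigmoid": (lambda v: 1.0 / (1.0 + np.exp(-v)), lambda y: y * (1.0 - y)),
    "tanh":    (np.tanh,                            lambda y: 1.0 - y**2),
    "linear":  (lambda v: v,                        lambda y: np.ones_like(y)),
}

def forward(W, x, unit_acts):
    """Net input v = W x, then a possibly different activation per unit."""
    v = W @ x
    return np.array([ACTS[a][0](vi) for a, vi in zip(unit_acts, v)])

def steepest_descent_step(W, x, t, unit_acts, eta=0.1):
    """One steepest-descent update W <- W - eta * dE/dW."""
    y = forward(W, x, unit_acts)
    # delta_k = -(t_k - y_k) * phi_k'(v_k), with phi_k' expressed via y_k.
    delta = -(t - y) * np.array([ACTS[a][1](yk) for a, yk in zip(unit_acts, y)])
    return W - eta * np.outer(delta, x)

# Tiny usage example: 2 inputs, 3 output units with mixed activations.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(3, 2))
x = np.array([0.5, -1.0])
t = np.array([0.2, 0.9, -0.4])
acts = ["sigmoid", "tanh", "linear"]
for _ in range(200):
    W = steepest_descent_step(W, x, t, acts)
print(forward(W, x, acts))  # outputs move toward the targets t
```

Because the derivative of each unit's activation enters only through the per-unit factor delta_k, mixing activation functions changes nothing structural in the update; each unit simply contributes its own derivative term.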

Similar Articles

On the convergence speed of artificial neural networks in the solving of linear systems

Artificial neural networks have advantages such as learning, adaptation, fault tolerance, parallelism and generalization. This paper is a scrutiny of the application of diverse learning methods to the convergence speed of neural networks. For this aim, we first introduce a perceptron method based on artificial neural networks which has been applied to solving a non-singula...
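The truncated abstract suggests a perceptron-style iterative solver for a non-singular linear system. As a purely hypothetical sketch of that general idea (the function name solve_gd, the step size, and the iteration count are all invented here), gradient descent on the squared residual can solve Ax = b:

```python
# Hypothetical sketch only: gradient descent on 0.5 * ||A x - b||^2 as a
# neural-style iterative solver for a non-singular linear system A x = b.
import numpy as np

def solve_gd(A, b, eta=0.01, iters=5000):
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x -= eta * A.T @ (A @ x - b)   # gradient of 0.5 * ||A x - b||^2
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([9.0, 8.0])
print(solve_gd(A, b))   # approaches the exact solution [2, 3]
```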

On the use of back propagation and radial basis function neural networks in surface roughness prediction

Various artificial neural network types are examined and compared for the prediction of surface roughness in manufacturing technology. The aim of the study is to evaluate different kinds of neural networks and observe their performance and applicability on the same problem. More specifically, feed-forward artificial neural networks are trained with three different back propagation algorithms, ...

Equivariant nonstationary source separation

Most source separation methods focus on stationary sources, so higher-order statistics are necessary for successful separation unless the sources are temporally correlated. For nonstationary sources, however, it was shown [Neural Networks 8 (1995) 411] that source separation could be achieved by second-order decorrelation. In this paper, we consider the cost function proposed by Matsuoka et al. ...

Faster Convergence and Improved Performance in Least-Squares Training of Neural Networks for Active Sound Cancellation

This paper introduces new recursive least-squares algorithms with faster convergence and improved steady-state performance for the training of multilayer feedforward neural networks, used in a two-neural-network structure for multichannel nonlinear active sound cancellation. Nonlinearity in active sound cancellation systems is mostly found in the actuators. The paper introduces the main concepts ...
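The paper's specific recursive least-squares variants are not given in this excerpt. As a generic reference point only, the standard exponentially weighted RLS update for a linear model d ≈ wᵀφ is sketched below; the class name, forgetting factor, and initialization are textbook conventions, not the paper's algorithms.

```python
# Generic RLS sketch (standard textbook form, not the paper's algorithms):
# exponentially weighted recursive least squares for a linear model d ~ w^T phi.
import numpy as np

class RLS:
    def __init__(self, n, lam=0.99, delta=100.0):
        self.w = np.zeros(n)          # weight vector
        self.P = delta * np.eye(n)    # inverse correlation matrix estimate
        self.lam = lam                # forgetting factor

    def update(self, phi, d):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)    # gain vector
        e = d - self.w @ phi                  # a priori error
        self.w += k * e
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return e

# Usage: identify w_true = [1.0, -2.0] from noisy samples.
rng = np.random.default_rng(1)
rls, w_true = RLS(2), np.array([1.0, -2.0])
for _ in range(500):
    phi = rng.normal(size=2)
    rls.update(phi, phi @ w_true + 0.01 * rng.normal())
print(rls.w)   # close to [1.0, -2.0]
```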

A Framework for the Development of Globally Convergent Adaptive Learning Rate Algorithms

In this paper we propose a framework for developing globally convergent batch training algorithms with adaptive learning rate. The proposed framework provides conditions under which global convergence is guaranteed for adaptive learning rate training algorithms. To this end, the learning rate is appropriately tuned along the given descent direction. Providing conditions regarding the search dir...
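As a hedged illustration of tuning the learning rate along a given descent direction, the following backtracking (Armijo-style) line search is one standard way to enforce sufficient decrease; it is a generic sketch under invented names and constants, not the framework's actual convergence conditions.

```python
# Generic Armijo backtracking sketch (not the paper's exact framework):
# shrink the step along a descent direction d until sufficient decrease holds.
import numpy as np

def armijo_step(f, grad_f, x, d, eta0=1.0, beta=0.5, sigma=1e-4):
    eta, fx, slope = eta0, f(x), grad_f(x) @ d   # slope < 0 for a descent direction
    while f(x + eta * d) > fx + sigma * eta * slope:
        eta *= beta                              # backtrack
    return eta

# Usage on a simple quadratic with the steepest-descent direction d = -grad.
f = lambda x: 0.5 * x @ x
grad_f = lambda x: x
x = np.array([4.0, -2.0])
for _ in range(10):
    d = -grad_f(x)
    x = x + armijo_step(f, grad_f, x, d) * d
print(x)   # approaches the minimizer [0, 0]
```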

Journal:

Volume   Issue

Pages  -

Publication year: 2014